Petri Net Based Modelling of a Career Syllabus
A syllabus is a set of courses, their prerequisites and a set of rules defining the continuance of a student in an academic program. The manual verification of the possible contradictions between prerequisites and the set of rules is a difficult task due to the large number of possible cases. In this article we present the different stages in the verification of a syllabus, its modelling and analysis using Petri nets, and suggested ways in which this may be used by the university administration in the decision-making process.
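The prerequisite structure described above can be sketched as a tiny Petri net: places hold tokens for passed courses, and a transition (taking a course) is enabled only when all of its prerequisite places are marked. This is an illustrative sketch with hypothetical course names, not the model from the article; note that, unlike standard Petri net semantics, firing here does not consume the prerequisite tokens, since passing a course remains true afterwards.

```python
class PetriNet:
    def __init__(self):
        self.marking = set()      # places currently holding a token
        self.transitions = {}     # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (set(inputs), set(outputs))

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return inputs <= self.marking   # all prerequisites satisfied

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"{name} is not enabled")
        _, outputs = self.transitions[name]
        # Prerequisites stay marked: passing a course is not undone.
        self.marking |= outputs


net = PetriNet()
net.marking = {"enrolled"}
net.add_transition("take_calculus_1", {"enrolled"}, {"calculus_1"})
net.add_transition("take_calculus_2", {"calculus_1"}, {"calculus_2"})

net.fire("take_calculus_1")
assert net.enabled("take_calculus_2")      # prerequisite now satisfied
```

A contradiction between prerequisites and continuance rules would show up in such a model as a transition that can never become enabled from the initial marking.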
Next-Generation EU DataGrid Data Management Services
We describe the architecture and initial implementation of the next generation of Grid Data Management Middleware in the EU DataGrid (EDG) project. The new architecture stems from our experience and the user requirements gathered during the two years of running our initial set of Grid Data Management Services. All of our new services are based on the Web Service technology paradigm, very much in line with the emerging Open Grid Services Architecture (OGSA). We have modularized our components and invested a great amount of effort in making the services secure, extensible and robust, starting from the design but also using a streamlined build and testing framework. Our service components are: Replica Location Service, Replica Metadata Service, Replica Optimization Service, Replica Subscription and high-level replica management. The service security infrastructure is fully GSI-enabled, hence compatible with the existing Globus Toolkit 2-based services; moreover, it allows for fine-grained authorization mechanisms that can be adjusted depending on the service semantics.
Comment: Talk from the 2003 Computing in High Energy and Nuclear Physics conference (CHEP03), La Jolla, CA, USA, March 2003. 8 pages, LaTeX.
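The core function of a replica location service like the one listed above is a catalog mapping logical file names (LFNs) to the physical locations (PFNs) of their replicas. The following is a minimal illustrative sketch of that mapping, not the EDG API; the class, file names and site URLs are hypothetical.

```python
class ReplicaLocationService:
    def __init__(self):
        self.catalog = {}   # LFN -> set of physical file names (PFNs)

    def register(self, lfn, pfn):
        """Record that a replica of lfn exists at pfn."""
        self.catalog.setdefault(lfn, set()).add(pfn)

    def lookup(self, lfn):
        """Return all known replica locations for lfn, sorted."""
        return sorted(self.catalog.get(lfn, set()))


rls = ReplicaLocationService()
rls.register("lfn:/higgs/run42.dat", "gsiftp://cern.ch/data/run42.dat")
rls.register("lfn:/higgs/run42.dat", "gsiftp://ral.ac.uk/data/run42.dat")
print(rls.lookup("lfn:/higgs/run42.dat"))
```

A higher-level replica manager would consult such a catalog and then pick one PFN, which is where an optimization service comes in.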
Evaluation of an Economy-Based File Replication Strategy for a Data Grid
Optimising the use of Grid resources is critical for users to effectively exploit a Data Grid. Data replication is considered a major technique for reducing data access cost to Grid jobs. This paper evaluates a novel replication strategy, based on an economic model, that optimises both the selection of replicas for running jobs and the dynamic creation of replicas in Grid sites. In our model, optimisation agents are located on Grid sites and use an auction protocol for selecting the optimal replica of a data file and a prediction function to make informed decisions about local data replication. We evaluate our replication strategy with OptorSim, a Data Grid simulator developed by the authors. The experiments show that our proposed strategy results in a notable improvement over traditional replication strategies in a Grid environment.
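The auction idea in the abstract above can be sketched very simply: candidate sites bid the cost at which they can deliver a replica, and the requesting agent selects the lowest bid. The cost figures and site names below are invented for illustration; this is not OptorSim's actual protocol or cost model.

```python
def run_auction(bids):
    """bids: dict mapping site name -> offered transfer cost.
    Return the site with the cheapest offer."""
    if not bids:
        raise ValueError("auction received no bids")
    return min(bids, key=bids.get)


bids = {"site_A": 12.5, "site_B": 4.0, "site_C": 7.3}
assert run_auction(bids) == "site_B"   # cheapest offer wins
```

In the paper's model the same agents also use a prediction function over past access patterns to decide whether the winning replica is worth copying locally.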
Catallaxy-based Grid markets
Grid computing has recently become an important paradigm for managing computationally demanding applications, composed of a collection of services. The dynamic discovery of services, and the selection of a particular service instance providing the best value out of the discovered alternatives, poses a complex multi-attribute n:m allocation decision problem, which is often solved using a centralized resource broker. To manage complexity, this article proposes a two-layer architecture for service discovery in such Application Layer Networks (ALN). The first layer consists of a service market in which complex services are translated to a set of basic services, which are distinguished by price and availability. The second layer provides an allocation of services to appropriate resources in order to enact the specified services. This framework comprises the foundations for a later comparison of centralized and decentralized market mechanisms for allocation of services and resources in ALNs and Grids.
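The first layer described above can be sketched as follows: a complex service is decomposed into basic services, and for each basic service the cheapest currently available instance is chosen from the market. The provider names, prices and availability flags are hypothetical; a real catallactic market would reach prices through negotiation rather than a fixed offer table.

```python
# Market offers: basic service -> list of (provider, price, available).
offers = {
    "storage": [("provider_1", 5.0, True), ("provider_2", 3.0, False)],
    "compute": [("provider_3", 8.0, True), ("provider_4", 6.0, True)],
}


def allocate(complex_service, offers):
    """Map each basic service in the decomposition to the cheapest
    available provider offering it."""
    allocation = {}
    for basic in complex_service:
        available = [(p, price) for p, price, up in offers[basic] if up]
        if not available:
            raise LookupError(f"no available instance for {basic}")
        allocation[basic] = min(available, key=lambda offer: offer[1])[0]
    return allocation


assert allocate(["storage", "compute"], offers) == {
    "storage": "provider_1",   # provider_2 is cheaper but unavailable
    "compute": "provider_4",
}
```

The second layer would then repeat an analogous selection one level down, matching the chosen service instances to concrete resources.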